Dynamically Regularized Fast Recursive Least Squares

Author

  • Steven L. Gay
Abstract

This paper introduces a dynamically regularized fast recursive least squares (DR-FRLS) adaptive filtering algorithm. Numerically stabilized FRLS algorithms exhibit reliable and fast convergence with low complexity even when the excitation signal is highly self-correlated. FRLS still suffers from instability, however, when the condition number of the implicit excitation sample covariance matrix is very high. DR-FRLS overcomes this problem with a regularization process which increases the computational complexity by only 50%. The benefits of regularization include: 1) the ability to use small forgetting factors, resulting in improved tracking, and 2) better convergence than the standard regularization technique of noise injection. Also, DR-FRLS allows the degree of regularization to be modified quickly without restarting the algorithm.
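For context, the sketch below shows a conventional exponentially weighted RLS update with diagonal (δI) regularization: the O(N²) baseline that fast RLS algorithms accelerate. This is illustrative only and is not the paper's DR-FRLS (which is an O(N) fast algorithm with *dynamic* regularization); the function name and parameter values are assumptions.

```python
import numpy as np

def rls_identify(x, d, N=8, lam=0.98, delta=1e-2):
    """Exponentially weighted RLS with diagonal (delta*I) regularization.

    A generic O(N^2) sketch for context only. Parameters:
    lam   : forgetting factor (smaller -> faster tracking, higher variance)
    delta : regularization level; P(0) = I/delta approximates (delta*I)^-1
    """
    w = np.zeros(N)
    P = np.eye(N) / delta                # inverse of the regularized covariance
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]     # regression vector, most recent sample first
        k = P @ u / (lam + u @ P @ u)    # Kalman gain vector
        e = d[n] - w @ u                 # a priori estimation error
        w = w + k * e                    # tap-weight update
        P = (P - np.outer(k, u @ P)) / lam   # inverse-covariance update
    return w
```

Smaller `lam` tracks time-varying systems faster but amplifies the effect of a poorly conditioned excitation covariance, which is exactly the regime where regularization matters.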


Similar articles

SPARLS: A Low Complexity Recursive $\mathcal{L}_1$-Regularized Least Squares Algorithm

We develop a Recursive L1-Regularized Least Squares (SPARLS) algorithm for the estimation of a sparse tap-weight vector in the adaptive filtering setting. The SPARLS algorithm exploits noisy observations of the tap-weight vector output stream and produces its estimate using an Expectation-Maximization type algorithm. Simulation studies in the context of channel estimation, employing multipath wi...


Adaptive and Weighted Collaborative Representations for image classification

Recently, Zhang et al. (2011) proposed a classifier based on collaborative representations (CR) with regularized least squares (CRC-RLS) for image face recognition. CRC-RLS can replace sparse representation (SR) based classification (SRC) as a simple and fast alternative. With SR resulting from an l1-regularized least squares decomposition, CR starts from an l2-regularized least squares formul...


Extended fast fixed order RLS adaptive filters

The existing derivations of conventional fast RLS adaptive filters are intrinsically dependent on the shift structure in the input regression vectors. This structure arises when a tapped-delay line (FIR) filter is used as a modeling filter. In this paper, we show, unlike what original derivations may suggest, that fast fixed-order RLS adaptive algorithms are not limited to FIR filter structures...


Asymptotics of Gaussian Regularized Least Squares

We consider regularized least-squares (RLS) with a Gaussian kernel. We prove that if we let the Gaussian bandwidth σ → ∞ while letting the regularization parameter λ → 0, the RLS solution tends to a polynomial whose order is controlled by the relative rates of decay of 1/σ² and λ: if λ = σ^(-(2k+1)), then, as σ → ∞, the RLS solution tends to the kth-order polynomial with minimal empirical error. We...


Convex regularized recursive maximum correntropy algorithm

In this brief, a robust and sparse recursive adaptive filtering algorithm, called convex regularized recursive maximum correntropy (CR-RMC), is derived by adding a general convex regularization penalty term to the maximum correntropy criterion (MCC). An approximate expression for automatically selecting the regularization parameter is also introduced. Simulation results show that the CR-RMC can...
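To illustrate the robustness idea behind the maximum correntropy criterion, the sketch below uses a plain stochastic-gradient adaptive filter under MCC, a simpler relative of the recursive CR-RMC described above, without the convex regularization term. The function name, step size, and kernel width are illustrative assumptions.

```python
import numpy as np

def mcc_lms(x, d, N=4, mu=0.05, sigma=1.0):
    """Stochastic-gradient adaptive filter under the maximum correntropy criterion.

    A simplified sketch (gradient MCC, not the recursive CR-RMC): the Gaussian
    kernel weight exp(-e^2 / (2*sigma^2)) down-weights large errors, giving
    robustness to impulsive noise that would derail an MSE-based update.
    """
    w = np.zeros(N)
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]        # regression vector, most recent first
        e = d[n] - w @ u                    # a priori error
        g = np.exp(-e * e / (2.0 * sigma * sigma))  # correntropy kernel weight
        w = w + mu * g * e * u              # outliers (large e) get weight ~ 0
    return w
```

A large outlier (e.g. e ≈ 50 with sigma = 1) makes the kernel weight vanish, so the update is effectively skipped, whereas an LMS/RLS update would take a proportionally huge, destabilizing step.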




Journal:

Volume   Issue

Pages  -

Publication date: 1999